-
We propose a new adaptive feedback cancellation (AFC) system for hearing aids (HAs) based on a well-posed optimization criterion that jointly considers decorrelation of the signals and sparsity of the underlying channel. We show that a least squares criterion on subband errors, regularized by a p-norm-like diversity measure, can simultaneously decorrelate the speech signals and exploit the sparsity of the acoustic feedback path impulse response. Unlike traditional subband adaptive filters, whose shorter sub-filters make them ill-suited to incorporating sparsity, the proposed framework promotes sparse characteristics because its update rule, while utilizing subband information, operates in the fullband. Simulation results show that the normalized misalignment, added stable gain, and other objective metrics of the AFC are significantly improved by choosing a proper sparsity-promoting factor and a suitable number of subbands. More importantly, the results indicate that the benefits of subband decomposition and sparsity promotion are complementary and additive for AFC in HAs.
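For illustration, a criterion of the kind described above can be written schematically as follows; the symbols (subband index i, coefficients w_k, diversity order p, smoothing constant ε, and regularization weight γ) are generic choices made here and not the exact notation of the paper.

```latex
J(\mathbf{w}) \;=\; \sum_{i=1}^{N} \bigl\lVert \mathbf{e}_i(\mathbf{w}) \bigr\rVert_2^2
\;+\; \gamma \sum_{k=1}^{L} \bigl( w_k^2 + \epsilon \bigr)^{p/2},
\qquad 0 < p \le 1,
```

where e_i(w) is the error signal in the i-th of N subbands, w collects the L fullband coefficients of the estimated feedback path, and the second term is a p-norm-like diversity measure that promotes sparsity.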
-
While deep neural networks (DNNs) have achieved state-of-the-art results in many fields, they are typically over-parameterized. Parameter redundancy, in turn, leads to inefficiency. Sparse signal recovery (SSR) techniques, on the other hand, find compact solutions to over-complete linear problems. Therefore, a logical step is to draw the connection between SSR and DNNs. In this paper, we explore the application of iterative reweighting methods popular in SSR to learning efficient DNNs. By efficient, we mean sparse networks that require less computation and storage than the original, dense network. We propose a reweighting framework to learn sparse connections within a given architecture without biasing the optimization process, by utilizing the affine scaling transformation strategy. The resulting algorithm, referred to as Sparsity-promoting Stochastic Gradient Descent (SSGD), has simple gradient-based updates which can be easily implemented in existing deep learning libraries. We demonstrate the sparsification ability of SSGD on image classification tasks and show that it outperforms existing methods on the MNIST and CIFAR-10 datasets.
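As a rough illustration of how iterative reweighting can be folded into a gradient-based update, the sketch below adds the gradient of a smoothed p-norm-like diversity measure to an ordinary SGD step. The function name, hyperparameters, and toy quadratic loss are placeholders chosen here; this is not the SSGD algorithm of the paper.

```python
import numpy as np

def sparsity_promoting_step(w, grad_loss, lr=0.01, lam=1e-4, p=0.5, eps=1e-8):
    """One sparsity-promoting gradient step (illustrative sketch).

    Adds the gradient of the smoothed diversity measure
    sum_k (w_k^2 + eps)^(p/2) to the ordinary loss gradient.  The factor
    (w_k^2 + eps)^(p/2 - 1) acts like an iteratively reweighted penalty:
    small weights are pushed toward zero more aggressively.
    """
    diversity_grad = p * w * (w**2 + eps) ** (p / 2.0 - 1.0)
    return w - lr * (grad_loss + lam * diversity_grad)

# Toy usage: shrink a weight vector against a simple quadratic loss
w = np.array([0.5, -0.02, 0.8, 0.001])
target = np.array([0.5, 0.0, 0.8, 0.0])
for _ in range(200):
    grad_loss = w - target            # gradient of 0.5 * ||w - target||^2
    w = sparsity_promoting_step(w, grad_loss)
print(np.round(w, 4))                 # near-zero entries are driven toward 0
```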
-
In this paper, a novel way of deriving proportionate adaptive filters is proposed based on diversity measure minimization using the iterative reweighting techniques well known in the sparse signal recovery (SSR) area. The resulting least mean square (LMS)-type and normalized LMS (NLMS)-type sparse adaptive filtering algorithms can incorporate various diversity measures that have proved effective in SSR. Furthermore, by setting the regularization coefficient of the diversity measure term to zero in the resulting algorithms, the Sparsity-promoting LMS (SLMS) and Sparsity-promoting NLMS (SNLMS) are introduced, which exploit, but do not strictly enforce, the sparsity of the system response if it already exists. Moreover, unlike most existing proportionate algorithms, which design the step-size control factors based on heuristics, our SSR-based framework leads to designing the factors in a more systematic way. Simulation results are presented to demonstrate the convergence behavior of the derived algorithms for systems with different sparsity levels.
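To make the flavor of such a proportionate update concrete, here is a minimal NLMS-style sketch in which the per-coefficient gains follow a reweighting rule of the kind used in SSR (gains growing with |w_k|^(2-p); p = 1 gives PNLMS-like gains proportional to |w_k|). The variable names, gain rule, and toy experiment are illustrative assumptions, not the SLMS/SNLMS algorithms themselves.

```python
import numpy as np

def proportionate_nlms_step(w, x, d, mu=0.5, p=1.0, eps=1e-6):
    """One proportionate NLMS-style update (illustrative sketch).

    w : current coefficient estimate, x : input regressor (same length),
    d : desired sample.  Gains g_k grow with |w_k|^(2-p), so large
    coefficients adapt faster, which speeds convergence for sparse systems.
    """
    e = d - np.dot(w, x)                     # a-priori error
    g = np.abs(w) ** (2.0 - p) + eps         # proportionate step-size gains
    g = g / np.sum(g)                        # normalize the gains
    norm = np.dot(x, g * x) + eps            # gain-weighted input power
    return w + mu * e * g * x / norm, e

# Toy usage: identify a sparse 16-tap system from noisy samples
rng = np.random.default_rng(0)
h = np.zeros(16); h[[2, 9]] = [1.0, -0.5]    # sparse true system
w = np.zeros(16)
x_buf = np.zeros(16)
for _ in range(2000):
    x_buf = np.roll(x_buf, 1); x_buf[0] = rng.standard_normal()
    d = np.dot(h, x_buf) + 1e-3 * rng.standard_normal()
    w, _ = proportionate_nlms_step(w, x_buf, d)
print(np.round(w, 2))                        # approaches the sparse system h
```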
-
We show that a new design criterion, namely least squares on subband errors regularized by a weighted norm, can be used to generalize the proportionate-type normalized subband adaptive filtering (PtNSAF) framework. The new criterion directly penalizes subband errors and includes a sparsity penalty term which is minimized using the damped regularized Newton's method. The impact of the proposed generalized PtNSAF (GPtNSAF) is studied for the system identification problem via computer simulations. Specifically, we study the effects of using different numbers of subbands and various sparsity penalty terms for quasi-sparse, sparse, and dispersive systems. The results show that the benefit of increasing the number of subbands is larger than that of promoting sparsity of the estimated filter coefficients when the target system is quasi-sparse or dispersive. On the other hand, for sparse target systems, promoting sparsity becomes more important. More importantly, the two aspects provide complementary and additive benefits to the GPtNSAF for speeding up convergence.
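Schematically, and with generic symbols chosen here rather than the paper's notation, a criterion of this kind and a damped regularized Newton step for minimizing it can be sketched as:

```latex
J(\mathbf{w}) \;=\; \sum_{i=1}^{N} \bigl\lVert \mathbf{e}_i(\mathbf{w}) \bigr\rVert_2^2
\;+\; \gamma\, \mathbf{w}^{\mathsf{T}} \mathbf{G}^{-1}(\mathbf{w})\, \mathbf{w},
\qquad
\mathbf{w} \;\leftarrow\; \mathbf{w} \;-\; \mu \bigl( \nabla^2 J(\mathbf{w}) + \delta \mathbf{I} \bigr)^{-1} \nabla J(\mathbf{w}),
```

where e_i is the i-th subband error, G(w) is a diagonal matrix of proportionate step-size control factors defining the weighted norm, μ is the damping (step-size) factor, and δI regularizes the Newton step.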
-
Acoustic feedback control continues to be a challenging problem due to the emerging form factors in advanced hearing aids (HAs) and hearables. In this paper, we present a novel use of well-known all-pass filters in a network to perform frequency warping, which we call “freping.” Freping helps in breaking the Nyquist stability criterion and improves adaptive feedback cancellation (AFC). Based on informal subjective assessments, distortions due to freping are fairly benign. While common objective metrics like the perceptual evaluation of speech quality (PESQ) and the hearing-aid speech quality index (HASQI) may not adequately capture distortions due to freping and acoustic feedback artifacts from a perceptual perspective, they are still instructive in assessing the proposed method. We demonstrate quality improvements with freping for a basic AFC (PESQ: 2.56 to 3.52; HASQI: 0.65 to 0.78) at a gain setting of 20, and for an advanced AFC (PESQ: 2.75 to 3.17; HASQI: 0.66 to 0.73) at a gain of 30. In our investigations, freping provides a larger improvement for the basic AFC, but still improves overall system performance for many AFC approaches.
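The usual building block for this kind of frequency warping is a first-order all-pass section. The sketch below shows only that standard section, as an assumed illustration; it does not reproduce the freping network described in the paper, and the warping coefficient a is an arbitrary choice.

```python
import numpy as np
from scipy.signal import lfilter, freqz

def allpass_section(x, a=0.2):
    """First-order all-pass A(z) = (z^-1 - a) / (1 - a*z^-1).

    Unit magnitude at every frequency but a frequency-dependent phase
    shift; cascades of such sections are the standard way to warp the
    frequency axis of a filter or signal path.
    """
    return lfilter([-a, 1.0], [1.0, -a], x)

# Numerically verify the all-pass (unit-magnitude) property
w, h = freqz([-0.2, 1.0], [1.0, -0.2], worN=512)
print(np.allclose(np.abs(h), 1.0))   # True: magnitude response is flat
```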
